A Bounded Divergence Measure Based on The Bhattacharyya Coefficient

Authors

  • Shivakumar Jolad
  • Ahmed Roman
  • Mahesh C. Shastry
Abstract

We introduce a new divergence measure, the bounded Bhattacharyya distance (BBD), for quantifying the dissimilarity between probability distributions. BBD is based on the Bhattacharyya coefficient (fidelity), and is symmetric, positive semi-definite, and bounded. Unlike the Kullback-Leibler divergence, BBD does not require the probability density functions to be absolutely continuous with respect to each other. We show that BBD belongs to the class of Csiszár f-divergences and derive relationships between BBD and well-known measures such as the Bhattacharyya, Hellinger, and Jensen-Shannon divergences. Bounds on the Bayesian error probability are established in terms of the BBD measure. We show that the curvature of BBD in the parameter space of families of distributions is proportional to the Fisher information. For distributions with vector-valued parameters, the curvature matrix can be used to obtain the Rao geodesic distance between them. We also discuss a potential application of probability distance measures in model selection.
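The abstract does not reproduce the paper's exact BBD functional form, but the building block it names, the Bhattacharyya coefficient, and the closely related Hellinger distance (also bounded and symmetric) are standard quantities. A minimal sketch for discrete distributions, assuming simple probability vectors as input:

```python
import numpy as np

def bhattacharyya_coefficient(p, q):
    """Bhattacharyya coefficient (fidelity) BC = sum_i sqrt(p_i * q_i).

    Equals 1 when p == q and 0 when the supports are disjoint; no
    absolute-continuity requirement, since zeros contribute nothing.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(np.sqrt(p * q)))

def hellinger_distance(p, q):
    """Hellinger distance H = sqrt(1 - BC), bounded in [0, 1]."""
    return float(np.sqrt(1.0 - bhattacharyya_coefficient(p, q)))

# Identical distributions: BC = 1, H = 0.
print(bhattacharyya_coefficient([0.5, 0.5], [0.5, 0.5]))  # -> 1.0
# Disjoint supports: BC = 0, H = 1 (the bounded extreme).
print(hellinger_distance([1.0, 0.0], [0.0, 1.0]))  # -> 1.0
```

Any bounded divergence built from BC, such as the paper's BBD, inherits these properties: it stays finite even when one distribution assigns zero probability where the other does not, which is where Kullback-Leibler divergence blows up.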

Related works

On Improved Bounds for Probability Metrics and f-Divergences (Irwin and Joan Jacobs Center for Communication and Information Technologies)

Derivation of tight bounds for probability metrics and f -divergences is of interest in information theory and statistics. This paper provides elementary proofs that lead, in some cases, to significant improvements over existing bounds; they also lead to the derivation of some existing bounds in a simplified way. The inequalities derived in this paper relate between the Bhattacharyya parameter,...



A note on decision making in medical investigations using new divergence measures for intuitionistic fuzzy sets

Srivastava and Maheshwari (Iranian Journal of Fuzzy Systems 13(1) (2016) 25-44) introduced a new divergence measure for intuitionistic fuzzy sets (IFSs). The properties of the proposed divergence measure were studied, and its efficiency in the context of medical diagnosis was also demonstrated. In this note, we point out some errors in ...


On Low Distortion Embeddings of Statistical Distance Measures into Low Dimensional Spaces

Statistical distance measures have found wide applicability in information retrieval tasks that typically involve high dimensional datasets. In order to reduce the storage space and ensure efficient performance of queries, dimensionality reduction while preserving the inter-point similarity is highly desirable. In this paper, we investigate various statistical distance measures from the point o...


Measure of non strict singularity of Schechter essential spectrum of two bounded operators and application

In this paper, we discuss the essential spectrum of the sum of two bounded operators using the measure of non-strict singularity. Based on this new investigation, a problem of the one-speed neutron transport operator is presented.



Journal title:

Volume   Issue

Pages  -

Publication date: 2012